Theory and Applications of Simulated Annealing for Nonlinear Constrained Optimization∗
Authors
Abstract
In this chapter, we present constrained simulated annealing (CSA), an algorithm that extends conventional simulated annealing to look for constrained local minima of constrained optimization problems. The algorithm is based on the theory of extended saddle points (ESPs), which shows the one-to-one correspondence between a constrained local minimum of the problem and an ESP of the corresponding penalty function. CSA finds ESPs by systematically controlling probabilistic descents in the original variable space of the penalty function and probabilistic ascents in the penalty space. Based on the decomposition of the necessary and sufficient ESP condition into multiple necessary conditions, we also describe constraint-partitioned simulated annealing (CPSA), which exploits the locality of constraints in nonlinear optimization problems. CPSA has much lower complexity than CSA because it partitions the constraints of a problem into exponentially simpler subproblems, solves each independently, and resolves the global constraints violated across the subproblems. We evaluate CSA and CPSA by applying them to solve some continuous constrained optimization problems and compare their performance to that of other penalty methods. Finally, we apply CSA to solve two real-world applications, one on sensor-network placement design and another on out-of-core compiler code generation.

1 Problem Definition

A general mixed-integer nonlinear programming problem (MINLP) is formulated as follows:

    (Pm):  min_z  f(z),                                    (1)
           subject to  h(z) = 0  and  g(z) ≤ 0,

where z = (x, y)^T ∈ Z; x ∈ R^v and y ∈ D^w are, respectively, bounded continuous and discrete variables; f(z) is a lower-bounded objective function; g(z) = (g_1(z), ..., g_r(z)) is a vector of r inequality constraint functions;^1 and h(z) = (h_1(z), ..., h_m(z)) is a vector of m equality constraint functions.
Functions f(z), g(z), and h(z) are general functions that can be discontinuous, nondifferentiable, and not in closed form.

∗ Research supported by the National Science Foundation Grant IIS 03-12084 and a Department of Energy Early Career Principal Investigator Grant.
** Department of Electrical and Computer Engineering and the Coordinated Science Laboratory, University of Illinois, Urbana-Champaign, Urbana, IL 61801, http://manip.crhc.uiuc.edu, [email protected].
*** Department of Computer Science, Washington University, St. Louis, MO 63130.
**** Synopsys, Inc., 700 East Middlefield Road, Mountain View, CA 94043.

^1 Given two vectors V1 and V2 of the same dimension, V1 ≥ V2 means that each element of V1 is greater than or equal to the corresponding element of V2; V1 > V2 means that at least one element of V1 is strictly greater than the corresponding element of V2 and the remaining elements are greater than or equal to the corresponding elements of V2.
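To make the formulation above concrete, the following is a minimal Python sketch of an l1-penalty evaluation for a toy instance of Pm, together with the Metropolis-style acceptance test that underlies CSA's probabilistic descents in z (and, with the sign of the change flipped, its ascents in the penalty multipliers). The instance, the function names, and the penalty form are all illustrative assumptions, not the authors' implementation:

```python
import math
import random

# Hypothetical instance of Pm: minimize f(z) subject to h(z) = 0, g(z) <= 0.
# For simplicity z is purely continuous here (x in R^2, no discrete part y).
def f(z):
    return z[0] ** 2 + z[1] ** 2          # lower-bounded objective

def h(z):
    return [z[0] + z[1] - 1.0]            # one equality constraint, h(z) = 0

def g(z):
    return [0.25 - z[0]]                  # one inequality constraint, g(z) <= 0

def penalty(z, alpha, beta):
    """l1-penalty function: f(z) plus multiplier-weighted constraint violations.
    alpha and beta play the role of the penalty multipliers that CSA ascends on."""
    viol_h = sum(a * abs(hv) for a, hv in zip(alpha, h(z)))
    viol_g = sum(b * max(0.0, gv) for b, gv in zip(beta, g(z)))
    return f(z) + viol_h + viol_g

def metropolis_accept(delta, temperature):
    """Accept a move that changes the penalty value by delta at the given
    temperature: always accept improvements, accept worsening moves with
    probability exp(-delta / temperature)."""
    return delta <= 0 or random.random() < math.exp(-delta / temperature)
```

For example, penalty([0.6, 0.4], [1.0], [1.0]) evaluates to 0.52: the objective contributes 0.36 + 0.16, the equality constraint is satisfied exactly, and the inequality 0.25 - 0.6 ≤ 0 is inactive, so no penalty is added.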
Similar resources
Constrained Simulated Annealing with Applications in Nonlinear Continuous Constrained Global Optimization
This paper improves constrained simulated annealing (CSA), a discrete global minimization algorithm with asymptotic convergence to discrete constrained global minima with probability one. The algorithm is based on the necessary and sufficient conditions for discrete constrained local minima in the theory of discrete Lagrange multipliers. We extend CSA to solve nonlinear continuous constrained op...
Constrained genetic algorithms and their applications in nonlinear constrained optimization
This paper presents a problem-independent framework that unifies various mechanisms for solving discrete constrained nonlinear programming (NLP) problems whose functions are not necessarily differentiable and continuous. The framework is based on the first-order necessary and sufficient conditions in the theory of discrete constrained optimization using Lagrange multipliers. It implements the search ...
Simulated annealing with asymptotic convergence for nonlinear constrained optimization
In this paper, we present constrained simulated annealing (CSA), an algorithm that extends conventional simulated annealing to look for constrained local minima of nonlinear constrained optimization problems. The algorithm is based on the theory of extended saddle points (ESPs) that shows the one-to-one correspondence between a constrained local minimum and an ESP of the corresponding penalty fu...
Tuning Strategies in Constrained Simulated Annealing for Nonlinear Global Optimization
This paper studies various strategies in constrained simulated annealing (CSA), a global optimization algorithm that achieves asymptotic convergence to constrained global minima (CGM) with probability one for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for discrete constrained local minima (CLM) in the theory...
Simulated Annealing with Asymptotic Convergence for Nonlinear Constrained Global Optimization
In this paper, we present constrained simulated annealing (CSA), a global minimization algorithm that converges to constrained global minima with probability one, for solving nonlinear discrete nonconvex constrained minimization problems. The algorithm is based on the necessary and sufficient condition for constrained local minima in the theory of discrete Lagrange multipliers we developed earl...
Simulated Annealing with Asymptotic Convergence for Nonlinear Constrained Global Optimization
In this paper, we present constrained simulated annealing (CSA), a global minimization algorithm that converges to constrained global minima with probability one, for solving nonlinear discrete non-convex constrained minimization problems. The algorithm is based on the necessary and sufficient condition for constrained local minima in the theory of discrete Lagrange multipliers we developed earli...
Journal title:
Volume/issue:
Pages: -
Publication date: 2008